Early Access: The content on this website is provided for informational purposes only in connection with pre-General Availability Qlik Products.
All content is subject to change and is provided without warranty.

Modifying data mart settings

You can modify the data mart settings according to your needs.

  1. In the Manage Data Marts window, select a data mart and click Settings.

    The Settings - <Data Mart Name> window opens. In the General tab, the following settings are available:

    • Log level: Select the log level granularity, which can be any of the following:
      • INFO (default) - Logs informational messages that highlight the progress of the ETL process at a coarse-grained level.
      • VERBOSE - Logs fine-grained informational events that are most useful to debug the ETL process.
      • TRACE - Logs finer-grained informational events than the VERBOSE level.

      The log levels VERBOSE and TRACE impact performance. Therefore, you should only select them for troubleshooting if advised by Qlik Support.

    • Load Type: Select Full rebuild to build the data mart from scratch each time or Incremental loading (default) to only load changes.

      Information note

      Incremental loading is not available for Aggregated or State Oriented fact tables.

    • Create tables in database - By default, data mart tables are created in the database specified in the data warehouse connection settings. Optionally, click the browse button and select a different database.

      Information note

      This option is only available for Microsoft SQL Server.

    • Create tables in schema - By default, data mart tables are created in the schema specified in the data warehouse connection settings. Optionally, specify a different schema, either by typing the schema name or by clicking the browse button and selecting one of the existing schemas. If you specify the name of a non-existing schema, Compose will create the schema automatically.

      Information note

      This option is only available for Microsoft SQL Server, Amazon Redshift, Snowflake, and Microsoft Azure Synapse Analytics.

    • Create views in schema - By default, data mart views are created in the schema specified in the data warehouse connection settings. Optionally, specify a different schema, either by typing the schema name or by clicking the browse button and selecting one of the existing schemas. If you specify the name of a non-existing schema, Compose will create the schema automatically, unless the database is Oracle.

      Information note

      If the view schema is different from the data mart schema, you need to grant the following permission:

      GRANT SELECT ON DM_SCHEMA TO DM_VIEW_SCHEMA WITH GRANT OPTION
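The permission template above can be expanded into a concrete statement by substituting your schema names. A minimal Python sketch, assuming SQL Server's GRANT ... ON SCHEMA:: syntax; note that in SQL Server the grantee is a database principal, so DM_VIEW_SCHEMA in the template stands for the user or role that owns the view schema. The schema names below are placeholders:

```python
def grant_view_schema_select(dm_schema: str, view_schema_owner: str) -> str:
    """Build a GRANT statement from the permission template above.

    Assumes SQL Server-style schema permission syntax; the data mart
    schema name and the principal owning the view schema are supplied
    by the caller.
    """
    return (f"GRANT SELECT ON SCHEMA::{dm_schema} "
            f"TO {view_schema_owner} WITH GRANT OPTION")

# Example with placeholder names:
print(grant_view_schema_select("dm_example", "views_owner_example"))
# → GRANT SELECT ON SCHEMA::dm_example TO views_owner_example WITH GRANT OPTION
```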

    See also Data mart views.

  2. In the Advanced tab, the following settings are available:

    • Sequential Processing: Select this option if you want to run all the data mart tasks sequentially, even if they can be run in parallel. This is useful for debugging or profiling.
    • JVM memory settings: Edit the memory for the Java Virtual Machine (JVM) if you experience performance issues. Xms is the minimum memory and Xmx is the maximum: the JVM starts with the Xms amount of memory and can use up to the Xmx amount.

      Information note

      Only the following characters are supported (shown as a regular expression):

      /^[-a-zA-Z0-9:]*$/
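As an illustration of the character restriction above, a memory value can be checked against the documented pattern before it is entered; a small Python sketch (the sample values are arbitrary):

```python
import re

# Character whitelist for JVM memory settings, taken from the note above.
VALID_JVM_SETTING = re.compile(r'^[-a-zA-Z0-9:]*$')

def is_valid_jvm_setting(value: str) -> bool:
    """Return True if the value uses only the supported characters."""
    return bool(VALID_JVM_SETTING.match(value))

print(is_valid_jvm_setting("-Xms512m"))  # True
print(is_valid_jvm_setting("-Xmx4g"))    # True
print(is_valid_jvm_setting("-Xmx4 g"))   # False: space is not allowed
```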

    • Position in Default Workflow: Set the position in which you want the data mart to appear in the default workflow. For more information on workflows, see Workflows.
    • Optimize for initial load: This option is not applicable to the State Oriented and Aggregated fact types. If the Incremental loading option is enabled (the default), clear the Optimize for initial load option after the initial load task completes, and then regenerate the data mart task. If the Full rebuild option is enabled, selecting Optimize for initial load may accelerate the loading process.
    • Write task statement duration to the TLOG_PROCLOG table in the data warehouse: This option is useful for troubleshooting performance issues with task statements as it records the duration of each task statement in a special table (named TLOG_PROCLOG) in the data warehouse. You can then use this information to locate task statements with abnormal duration times and modify them accordingly.
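To sketch how the recorded durations might be used, the query below sorts statements by duration so the slowest ones surface first. The TLOG_PROCLOG column names used here (statement_text, duration_seconds) are assumptions for illustration only, and SQLite stands in for the real data warehouse; check the actual table structure in your environment:

```python
import sqlite3

# Illustrative stand-in: the real TLOG_PROCLOG lives in the data
# warehouse, and its column names may differ from the ones assumed here.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE TLOG_PROCLOG (statement_text TEXT, duration_seconds REAL)")
con.executemany(
    "INSERT INTO TLOG_PROCLOG VALUES (?, ?)",
    [("load fact_orders", 42.0), ("update dim_customer", 3.5)])

# List the longest-running task statements first.
rows = con.execute(
    "SELECT statement_text, duration_seconds FROM TLOG_PROCLOG "
    "ORDER BY duration_seconds DESC LIMIT 5").fetchall()
print(rows)
# → [('load fact_orders', 42.0), ('update dim_customer', 3.5)]
```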
    • Do not drop temporary tables: Select this option if you want to keep the temporary tables created during the task. Use this option only for debugging.
    • Enable table logging: This option is available for Oracle only. When enabled, the data mart tables will be created with the Oracle LOGGING option enabled. Leaving this option unselected (the default) should improve performance, but in some cases DML operations will not be recorded in the redo log file. For more information on this option, refer to the Oracle online documentation.
  3. Click OK.
